1 | Visualization of Decision Processes Using a Cognitive Architecture
In: DTIC (2013)
BASE
|
2 | Toward Determining the Comprehensibility of Machine Translations
In: DTIC (2012)
|
|
4 | Cognitive Tools for Humanoid Robots in Space
In: DTIC AND NTIS (2004)
|
|
5 | Finding the FOO: A Pilot Study for a Multimodal Interface
In: DTIC AND NTIS (2003)
|
|
6 | Spatial Language for Human-Robot Dialogs
In: DTIC AND NTIS (2003)
|
|
7 | An Agent Driven Human-centric Interface for Autonomous Mobile Robots
In: DTIC AND NTIS (2003)
|
|
8 | Using Spatial Language in a Human-Robot Dialog
In: DTIC AND NTIS (2002)
|
|
9 | Multi-modal Interfacing for Human-Robot Interaction
In: DTIC AND NTIS (2001)
|
|
10 | Using a Natural Language and Gesture Interface for Unmanned Vehicles
In: DTIC AND NTIS (2000)

Abstract:
Unmanned vehicles, such as mobile robots, must exhibit adjustable autonomy. They must be able to be self-sufficient when the situation warrants; however, as they interact with each other and with humans, they must exhibit an ability to dynamically adjust their independence or dependence as co-operative agents attempting to achieve some goal. This is what we mean by adjustable autonomy. We have been investigating various modes of communication that enhance a robot's capability to work interactively with other robots and with humans. Specifically, we have been investigating how natural language and gesture can provide a user-friendly interface to mobile robots. We have extended this initial work to include semantic and pragmatic procedures that allow humans and robots to act co-operatively, based on whether or not goals have been achieved by the various agents in the interaction. By processing commands that are either spoken or initiated by clicking buttons on a Personal Digital Assistant and by gesturing either naturally or symbolically, we are tracking the various goals in the interaction, the agent involved in the interaction, and whether or not the goal has been achieved. The various agents involved in achieving the goals are each aware of their own and others' goals and what goals have been stated or accomplished so that eventually any member of the group, be it a robot or a human, if necessary, can interact with the other members to achieve the stated goals of a mission. The original document contains color images.

Keyword:
*NATURAL LANGUAGE; *ROBOTS; *UNMANNED VEHICLES; COMMUNICATION AND RADIO SYSTEMS; Cybernetics; HUMANS; INTERFACES; Linguistics; MOBILE; TRACKING; UNMANNED; USER FRIENDLY; VEHICLES

URL: http://oai.dtic.mil/oai/oai?&verb=getRecord&metadataPrefix=html&identifier=ADA435161 http://www.dtic.mil/docs/citations/ADA435161
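The goal-tracking scheme the abstract describes (each agent aware of its own and others' goals, which agent is responsible, and whether each goal has been achieved, so that any member of the group can take over) could be sketched minimally as a shared goal ledger. All class, method, and agent names below are illustrative assumptions, not taken from the paper:

```python
from dataclasses import dataclass

# Hypothetical sketch of the goal tracking described in the abstract:
# each stated goal records the agent responsible and whether it is done,
# and every agent shares the same ledger so any member can take over.
# Names here are illustrative; they do not come from the paper.

@dataclass
class Goal:
    description: str
    agent: str           # agent currently responsible ("robot1", "human", ...)
    achieved: bool = False

class GoalLedger:
    """Shared record of stated goals, visible to all agents in the group."""
    def __init__(self):
        self.goals: list[Goal] = []

    def state(self, description: str, agent: str) -> Goal:
        # a goal is "stated" (spoken, PDA button, or gesture) and assigned
        goal = Goal(description, agent)
        self.goals.append(goal)
        return goal

    def achieve(self, description: str) -> None:
        for g in self.goals:
            if g.description == description:
                g.achieved = True

    def reassign(self, description: str, new_agent: str) -> None:
        # adjustable autonomy: another group member takes over an open goal
        for g in self.goals:
            if g.description == description and not g.achieved:
                g.agent = new_agent

    def pending(self) -> list[Goal]:
        return [g for g in self.goals if not g.achieved]

ledger = GoalLedger()
ledger.state("go to the door", "robot1")
ledger.state("pick up the box", "robot2")
ledger.achieve("go to the door")
ledger.reassign("pick up the box", "human")
print([(g.description, g.agent) for g in ledger.pending()])
```

Because the ledger is shared rather than per-agent, the "aware of their own and others' goals" property in the abstract falls out for free: any agent can query `pending()` and step in.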
|
|
11 | Goal Tracking and Goal Attainment: A Natural Language Means of Achieving Adjustable Autonomy
In: DTIC AND NTIS (1999)
|
|
12 | Goal Tracking in a Natural Language Interface: Towards Achieving Adjustable Autonomy
In: DTIC AND NTIS (1999)
|
|
13 | Integrating Natural Language and Gesture in a Robotics Domain
In: DTIC AND NTIS (1998)
|
|
16 | Talking to InterFIS: Adding Speech Input to a Natural Language Interface
In: DTIC AND NTIS (1992)
|
|
18 | InterFIS: A Natural Language Interface to the Fault Isolation Shell
In: DTIC AND NTIS (1990)